Convergence Results for a Class of Abstract Continuous Descent Methods

Authors

  • SERGIU AIZICOVICI
  • ALEXANDER J. ZASLAVSKI
Abstract

We study continuous descent methods for the minimization of Lipschitzian functions defined on a general Banach space. We establish convergence theorems for those methods which are generated by approximate solutions to evolution equations governed by regular vector fields. Since the complement of the set of regular vector fields is σ-porous, we conclude that our results apply to most vector fields in the sense of Baire’s categories.
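For orientation, here is a minimal sketch of the setting that is standard in this line of work (Reich–Zaslavski and related papers); the symbols $f^{0}$, $V$, and $x(\cdot)$ are introduced for illustration and are not taken verbatim from the paper. For a Lipschitzian function $f$ on a Banach space $X$, the Clarke generalized directional derivative is
\[
f^{0}(x;h)=\limsup_{y\to x,\; t\downarrow 0}\frac{f(y+th)-f(y)}{t},
\]
a bounded vector field $V\colon X\to X$ is a descent vector field for $f$ if $f^{0}(x;V(x))\le 0$ for all $x\in X$, and a continuous descent method is a trajectory (or, as in this paper, an approximate solution) of the evolution equation
\[
x'(t)=V(x(t)),\qquad t\ge 0,
\]
with convergence typically understood as $f(x(t))\to\inf_{X}f$ as $t\to\infty$ when the vector field $V$ is regular.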

Similar articles

Unifying abstract inexact convergence theorems for descent methods and block coordinate variable metric iPiano

An abstract convergence theorem for a class of descent methods that explicitly models relative errors is proved. The convergence theorem generalizes and unifies several recent abstract convergence theorems, and is applicable to possibly non-smooth and non-convex lower semi-continuous functions that satisfy the Kurdyka–Łojasiewicz inequality, which comprises a very large class of problems. The descent...
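For context, the classical prototype of such abstract conditions (in the sense of Attouch, Bolte, and Svaiter) couples a sufficient-decrease inequality with a relative-error inequality; the constants $a,b>0$ and the subgradients $w_{k}$ below are illustrative, and the paper's own error model is more general:
\[
f(x_{k+1})+a\,\|x_{k+1}-x_{k}\|^{2}\le f(x_{k}),
\qquad
\|w_{k+1}\|\le b\,\|x_{k+1}-x_{k}\|,\quad w_{k+1}\in\partial f(x_{k+1}).
\]
Together with the Kurdyka–Łojasiewicz inequality, these two conditions are what typically yield convergence of the whole sequence $(x_{k})$ to a critical point of $f$.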

Barrier Operators and Associated Gradient-Like Dynamical Systems for Constrained Minimization Problems

We study some continuous dynamical systems associated with constrained optimization problems. For that purpose, we introduce the concept of elliptic barrier operators and develop a unified framework to derive and analyze the associated class of gradient-like dynamical systems, called A-Driven Descent Method (A-DM). Prominent methods belonging to this class include several continuous descent met...
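One well-known representative of such barrier-driven flows (given here only for illustration; the A-DM framework of the paper is more general, and the barrier $h$ and feasible set $C$ below are assumptions of this sketch) is the Hessian–Riemannian gradient flow
\[
\dot{x}(t)=-\bigl[\nabla^{2}h(x(t))\bigr]^{-1}\nabla f(x(t)),\qquad x(0)\in\operatorname{int}C,
\]
where $h$ is a strictly convex barrier for $C$, e.g. $h(x)=-\sum_{i}\log x_{i}$ for the positive orthant. Since $\nabla^{2}h\succ 0$, the objective $f$ decreases along the trajectory, and under standard assumptions the barrier-induced metric keeps the trajectory in the interior of $C$.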

A new Levenberg-Marquardt approach based on Conjugate gradient structure for solving absolute value equations

In this paper, we present a new approach for solving the absolute value equation (AVE), which uses the Levenberg-Marquardt method with a conjugate subgradient structure. In conjugate subgradient methods, the new direction is obtained by combining the steepest descent direction and the previous direction, which may not lead to good numerical results. Therefore, we replace the steepest descent dir...
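For reference, the absolute value equation and the classical Levenberg-Marquardt step read as follows (a sketch only; the symbols $F$, $J_{k}$, $\lambda_{k}$, $d_{k}$ are introduced here for illustration, and the paper builds a conjugate-subgradient-type direction on top of this basic structure):
\[
Ax-|x|=b,\qquad F(x):=Ax-|x|-b,
\]
with $|x|$ taken componentwise, and
\[
\bigl(J_{k}^{T}J_{k}+\lambda_{k}I\bigr)d_{k}=-J_{k}^{T}F(x_{k}),\qquad x_{k+1}=x_{k}+d_{k},
\]
where $J_{k}=A-\operatorname{diag}(\operatorname{sign}(x_{k}))$ is a generalized Jacobian of $F$ at $x_{k}$ and $\lambda_{k}>0$ is the damping parameter.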

Two Convergence Results for Continuous Descent Methods

We consider continuous descent methods for the minimization of convex functionals defined on a general Banach space. We establish two convergence results for methods which are generated by regular vector fields. Since the complement of the set of regular vector fields is σ-porous, we conclude that our results apply to most vector fields in the sense of Baire’s categories.

Residual norm steepest descent based iterative algorithms for Sylvester tensor equations

Consider the following consistent Sylvester tensor equation
\[
\mathscr{X}\times_{1}A+\mathscr{X}\times_{2}B+\mathscr{X}\times_{3}C=\mathscr{D},
\]
where the matrices $A, B, C$ and the tensor $\mathscr{D}$ are given and $\mathscr{X}$ is the unknown tensor. The current paper is concerned with a simple and neat framework for accelerating the speed of convergence of the gradient-based iterative algorithm and ...
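To make the truncated description concrete, a basic gradient-based iteration for this equation (sketched under the assumption that the paper accelerates a scheme of this type; the step-size rule below is the usual residual-norm-minimizing choice, and the symbols $\mathscr{R}_{k}$, $\mathscr{G}_{k}$, $\mu_{k}$ are illustrative) is
\[
\mathscr{R}_{k}=\mathscr{D}-\mathscr{X}_{k}\times_{1}A\times_{2}B\times_{3}C,\qquad
\mathscr{G}_{k}=\mathscr{R}_{k}\times_{1}A^{T}\times_{2}B^{T}\times_{3}C^{T},
\]
\[
\mathscr{X}_{k+1}=\mathscr{X}_{k}+\mu_{k}\,\mathscr{G}_{k},\qquad
\mu_{k}=\frac{\|\mathscr{G}_{k}\|_{F}^{2}}{\|\mathscr{G}_{k}\times_{1}A\times_{2}B\times_{3}C\|_{F}^{2}},
\]
where $\mu_{k}$ minimizes the residual norm $\|\mathscr{R}_{k+1}\|_{F}$ along the gradient direction $\mathscr{G}_{k}$.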

Journal:

Volume   Issue

Pages  -

Publication date: 2004